Half-tapering strategy for conditional simulation with large datasets
Authors
Abstract
Similar Articles
Covariance Tapering for Likelihood Based Estimation in Large Spatial Datasets
Maximum likelihood is an attractive method of estimating covariance parameters in spatial models based on Gaussian processes. However, calculating the likelihood can be computationally infeasible for large datasets, requiring O(n³) calculations for a dataset with n observations. This article proposes the method of covariance tapering to approximate the likelihood in this setting. In this approa...
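The tapering idea summarized in this snippet can be sketched in a few lines: the dense covariance matrix is multiplied element-wise by a compactly supported correlation (taper) function, and the resulting mostly-zero matrix replaces the original one in the Gaussian log-likelihood. The sketch below is illustrative only; the exponential covariance, spherical taper, parameter values, and synthetic locations are assumptions, not choices taken from the paper.

    import numpy as np
    from scipy.spatial.distance import cdist

    def exponential_cov(d, sigma2=1.0, rho=0.2):
        # Exponential covariance C(d) = sigma^2 * exp(-d / rho)  (assumed model).
        return sigma2 * np.exp(-d / rho)

    def spherical_taper(d, theta=0.15):
        # Compactly supported spherical correlation; exactly zero beyond range theta.
        t = np.clip(d / theta, 0.0, 1.0)
        return (1.0 - 1.5 * t + 0.5 * t ** 3) * (d < theta)

    rng = np.random.default_rng(0)
    coords = rng.uniform(size=(500, 2))              # hypothetical 2-D sampling locations
    d = cdist(coords, coords)

    C = exponential_cov(d)                           # dense model covariance
    C_tap = C * spherical_taper(d)                   # element-wise (Schur) product: mostly zeros
    z = rng.multivariate_normal(np.zeros(len(coords)), C)  # synthetic observations

    # Gaussian log-likelihood evaluated with the tapered covariance in place of C.
    # In practice the zeros in C_tap are exploited by sparse Cholesky factorizations,
    # which is what makes the approximation attractive for large n.
    L = np.linalg.cholesky(C_tap + 1e-10 * np.eye(len(coords)))
    alpha = np.linalg.solve(L, z)
    loglik = -0.5 * (alpha @ alpha) - np.log(np.diag(L)).sum() - 0.5 * len(z) * np.log(2 * np.pi)
    print(f"tapered log-likelihood: {loglik:.2f}; fraction of nonzeros kept: {(C_tap > 0).mean():.1%}")

Because the taper is itself a valid correlation function, the element-wise product remains positive semi-definite (Schur product theorem), so the tapered matrix can be factorized like an ordinary covariance.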
Covariance Tapering for Interpolation of Large Spatial Datasets
Interpolation of a spatially correlated random process is used in many areas. The best linear unbiased predictor, often called the kriging predictor in geostatistics, requires the solution of a large linear system based on the covariance matrix of the observations. In this article, we show that tapering the correct covariance matrix with an appropriate compactly supported covariance functi...
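For the interpolation setting described in this snippet, tapering enters the kriging equations by replacing the covariance with its tapered counterpart in both the n-by-n system matrix and the cross-covariance terms before solving for the prediction weights. The following is a minimal, self-contained sketch with an assumed exponential covariance, spherical taper, and placeholder data, not the authors' implementation.

    import numpy as np
    from scipy.spatial.distance import cdist

    def exp_cov(d, sigma2=1.0, rho=0.2):
        return sigma2 * np.exp(-d / rho)            # assumed exponential covariance

    def spherical_taper(d, theta=0.15):
        t = np.clip(d / theta, 0.0, 1.0)
        return (1.0 - 1.5 * t + 0.5 * t ** 3) * (d < theta)

    rng = np.random.default_rng(1)
    obs_xy = rng.uniform(size=(400, 2))             # hypothetical observation locations
    new_xy = rng.uniform(size=(5, 2))               # prediction locations
    z = rng.standard_normal(400)                    # placeholder observations

    d_oo = cdist(obs_xy, obs_xy)                    # observation-to-observation distances
    d_no = cdist(new_xy, obs_xy)                    # prediction-to-observation distances

    # Tapered simple kriging: use the tapered covariance in place of the dense one
    # when solving the linear system for the kriging weights.
    C_tap = exp_cov(d_oo) * spherical_taper(d_oo)
    c_tap = exp_cov(d_no) * spherical_taper(d_no)
    weights = np.linalg.solve(C_tap + 1e-10 * np.eye(len(obs_xy)), c_tap.T)
    pred = weights.T @ z                            # kriging predictions at new_xy
    print("tapered kriging predictions:", np.round(pred, 3))

A dense solve is used here for brevity; the practical payoff comes from handing the sparse tapered system to a sparse solver.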
Posterior Simulation in Countable Mixture Models for Large Datasets
Mixture models, or convex combinations of a countable number of probability distributions, offer an elegant framework for inference when the population of interest can be subdivided into latent clusters having random characteristics that are heterogeneous between, but homogeneous within, the clusters. Traditionally, the different kinds of mixture models have been motivated and analyzed from ver...
Machine Learning with Large Datasets
This paper introduces new algorithms and data structures for quick counting for machine learning datasets. We focus on the counting task of constructing contingency tables, but our approach is also applicable to counting the number of records in a dataset that match conjunctive queries. Subject to certain assumptions, the costs of these operations can be shown to be independent of the number of...
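For reference, the naive way to answer the counting queries mentioned in this snippet is a full pass over the records for every query; the sketch below builds a small contingency table and evaluates a conjunctive query in exactly that per-query fashion, which is the cost the paper's precomputed data structures aim to avoid. The records and attribute names are made up for illustration.

    from collections import Counter
    from itertools import product

    # Hypothetical categorical records: (weather, temperature, label).
    records = [
        ("sunny", "hot",  "no"),
        ("sunny", "mild", "yes"),
        ("rain",  "mild", "yes"),
        ("rain",  "hot",  "no"),
        ("sunny", "hot",  "yes"),
    ]

    # Contingency table over (weather, label): count of every attribute combination.
    table = Counter((weather, label) for weather, _, label in records)
    for cell in product(("sunny", "rain"), ("yes", "no")):
        print(cell, table[cell])

    # Records matching the conjunctive query weather == "sunny" AND label == "yes";
    # this naive scan is linear in the number of records and is repeated per query.
    print(sum(1 for weather, _, label in records if weather == "sunny" and label == "yes"))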
Interactive Simulation Steering in VR and Handling of Large Datasets
The growing complexity and sheer amount of data resulting from supercomputer simulations increase the need for new and intuitive analysis techniques such as VR. In this paper we present techniques that try to meet the demands of data analysis in a high-performance computing environment. After discussing the requirements for such a system with respect to flexibility and the handling of large data, we...
Journal
Journal title: Stochastic Environmental Research and Risk Assessment
Year: 2017
ISSN: 1436-3240,1436-3259
DOI: 10.1007/s00477-017-1386-z